A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM and Legacy Data

Oleg
26 July, 2010 | 2 min for reading

When I think about any PLM project, I can clearly see the step at which data already available in the organization needs to be loaded into the system. This step is often underestimated from several standpoints: the ability to gather and load information, the availability of data definitions, the availability of APIs, and system performance. I had a chance to write before about “legacy data import” as one of the three major factors impacting mainstream PLM deployment.

Data Sources
Let me try to break down the kinds of legacy data you can face during an implementation.

1. File Legacy
Existing documents, drawings, CAD models, and Office files. In most cases, these are “un-managed data resources” that need to be collected, analyzed, imported, and stored in the system.
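The collection step above can be sketched as a simple directory crawl. This is a minimal illustration, not any particular PLM tool; the extension list is a hypothetical example and would come from an audit of the actual file landscape:

```python
from pathlib import Path

# Hypothetical set of engineering and office formats to catalog;
# a real project would derive this list from the data audit.
LEGACY_EXTENSIONS = {".dwg", ".step", ".pdf", ".doc", ".docx", ".xls", ".xlsx"}

def collect_legacy_files(root):
    """Walk a directory tree and group un-managed files by extension."""
    catalog = {}
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in LEGACY_EXTENSIONS:
            catalog.setdefault(path.suffix.lower(), []).append(path)
    return catalog

if __name__ == "__main__":
    for ext, files in sorted(collect_legacy_files(".").items()):
        print(f"{ext}: {len(files)} file(s)")
```

The resulting catalog is the input to the analysis and import steps that follow.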

2. Relational Databases
In today’s enterprise landscape, a lot of data lives in RDBMS systems. You can find plenty of legacy data here, from early dBase tables up to various versions of database formats and systems. Connecting to these systems is, in most cases, very straightforward via an SQL-compliant driver or software.
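To make the SQL-driver point concrete, here is a minimal extraction sketch. It uses an in-memory SQLite database purely as a stand-in for any SQL-compliant legacy source; a real migration would swap in the appropriate driver (ODBC, a vendor client, etc.), and the `parts` table is an invented example:

```python
import sqlite3

def extract_table(connection, table):
    """Read every row of a legacy table as a dict keyed by column name."""
    connection.row_factory = sqlite3.Row
    # Table name comes from the migration plan, not from untrusted input.
    cursor = connection.execute(f"SELECT * FROM {table}")
    return [dict(row) for row in cursor.fetchall()]

if __name__ == "__main__":
    # Build a throwaway "legacy" database in memory for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE parts (part_no TEXT, description TEXT)")
    conn.execute("INSERT INTO parts VALUES ('P-100', 'Bracket')")
    print(extract_table(conn, "parts"))
```

Row-by-row dictionaries like these are a convenient intermediate form before mapping the data onto the new system’s model.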

3. Computer and Application Legacy
Often, you have systems that were implemented in the past and are still used by the company today. For some reason, access to their data storage is problematic. In this case, the only way in is to reach these applications via an available API or to reverse-engineer the data sources. Sometimes you can face old but still-functioning computer systems (a mainframe is one of the best examples) that continue to operate and hold a lot of information valuable to the organization.

Import vs. Federation
These are two distinct strategies for handling legacy data. With federation, you keep the data in its original form and systems; your PLM system accesses the legacy data sources to retrieve, connect, and transform the data into a new form. The alternative is to import the data in a single shot into the new system; in this case, the legacy systems become irrelevant, and you move fully to the new one. It is hard to say which strategy is best. The situation needs to be assessed based on an analysis of the systems involved. However, I have found legacy systems to be a very painful part of implementation.
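The contrast between the two strategies can be shown in a toy sketch. Everything here is hypothetical scaffolding, not a real PLM API: `LegacySource` stands in for any live legacy system, and the two classes differ only in when they touch it:

```python
class LegacySource:
    """Stand-in for a live legacy system holding item records."""
    def __init__(self, records):
        self._records = records  # e.g. {"P-100": {"description": "Bracket"}}

    def fetch(self, key):
        return self._records.get(key)

class FederatedPLM:
    """Federation: data stays in place; every lookup is a live round-trip."""
    def __init__(self, source):
        self._source = source

    def get_item(self, key):
        record = self._source.fetch(key)  # queried on demand
        return {"id": key, **record} if record else None

class ImportingPLM:
    """Import: copy everything once; the legacy source is no longer needed."""
    def __init__(self, source, keys):
        self._store = {k: source.fetch(k) for k in keys}  # one-shot load

    def get_item(self, key):
        record = self._store.get(key)  # local lookup only
        return {"id": key, **record} if record else None
```

The trade-off is visible even at this scale: federation depends on the legacy system staying available and performant, while import pays the full transformation cost up front and then cuts the dependency.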

What is my conclusion? Legacy data is important. The amount of data is growing exponentially. Handling legacy data and systems is a very painful task. Each time we come in with new systems, the problem of legacy data comes up again. PLM needs to learn to handle foreign lifecycle data, as well as lifecycle data produced by previous versions of PLM systems. It seems to me a very important piece of functionality that is almost entirely missing today. What is your opinion?

Best, Oleg
